Methods Inf Med 2012; 51(02): 122-130
DOI: 10.3414/ME10-01-0066
Original Articles
Schattauer GmbH

Health Services Research Evaluation Principles

Broadening a General Framework for Evaluating Health Information Technology
P. S. Sockolow
1   College of Nursing and Health Professions, Drexel University, Philadelphia, PA, USA

P. R. Crawford
2   Division of Health Sciences Informatics, School of Medicine, The Johns Hopkins University, Baltimore, MD, USA
3   Department of International Health, Bloomberg School of Public Health, The Johns Hopkins University, Baltimore, MD, USA

H. P. Lehmann
2   Division of Health Sciences Informatics, School of Medicine, The Johns Hopkins University, Baltimore, MD, USA
4   Department of Health Policy and Management, Bloomberg School of Public Health, The Johns Hopkins University, Baltimore, MD, USA
5   Department of Biostatistics, Bloomberg School of Public Health, The Johns Hopkins University, Baltimore, MD, USA

Publication History

received: 09 September 2010

accepted: 04 February 2011

Publication Date:
19 January 2018 (online)


Summary

Background: Our forthcoming national experiment in increased health information technology (HIT) adoption, funded by the American Recovery and Reinvestment Act of 2009, will require a comprehensive approach to evaluating HIT. The uneven quality of HIT evaluation studies to date limits the generalizability of findings and the depth of lessons learned, revealing a need for broader evaluation frameworks.

Objective: To develop an informatics evaluation framework for HIT that integrates components of health services research (HSR) evaluation and informatics evaluation, addressing identified shortcomings in available HIT evaluation frameworks.

Method: A systematic literature review updated and expanded the exhaustive review by Ammenwerth and de Keizer (AdK). From the retained studies, evaluation criteria were elicited and organized into classes within a framework. The resulting Health Information Technology Research-based Evaluation Framework (HITREF) was used to guide construction of a clinician satisfaction survey, multi-dimensional analysis of data, and interpretation of findings in an evaluation of a vanguard community health care EHR.

Results: The updated review identified 128 electronic health record (EHR) evaluation studies and seven evaluation criteria not in AdK: EHR Selection/Development/Training; Patient Privacy Concerns; Unintended Consequences/Benefits; Functionality; Patient Satisfaction with EHR; Barriers/Facilitators to Adoption; and Patient Satisfaction with Care. HITREF was used productively and proved to be a complete evaluation framework, encompassing all themes that emerged.

Conclusions: We recommend that future EHR evaluators consider adding a complete, research-based HIT evaluation framework, such as HITREF, to their suite of evaluation tools to monitor HIT challenges as the federal government strives to increase HIT adoption.